Weighted Nearest Neighbor Classifiers and First-order Error

Authors

  • Maya R. Gupta
  • William H. Mortensen
Abstract

Weighted nearest-neighbor classification is analyzed in terms of the squared error of class-probability estimates. Two classes of algorithms for calculating the weights are studied with respect to their ability to minimize the first-order term of the squared error: local linear regression and a new class termed regularized linear interpolation. A number of variants of each class are considered or proposed and compared analytically, by simulation, and in experiments on benchmark datasets. The experiments establish that weighting methods that aim to minimize first-order error can perform significantly better than standard k-NN, particularly in high dimensions. The regularization functions, the fitted surfaces, the cross-validated neighborhood size, and the effect of high dimensionality are also analyzed.
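To make the setting concrete, the sketch below estimates class probabilities with a weighted k-NN vote. Inverse-distance weighting is used here purely as an illustrative stand-in; the paper's algorithms instead solve for the weight vector (e.g. by local linear regression or regularized linear interpolation) so that the first-order error term is minimized. Function and variable names are assumptions, not the paper's code.

```python
import numpy as np

def weighted_knn_proba(X_train, y_train, x, k=5, eps=1e-12):
    """Class-probability estimate at x from a weighted k-NN vote.

    Inverse-distance weights are a stand-in for the weight vectors
    the paper derives to cancel the first-order error term.
    """
    d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
    idx = np.argsort(d)[:k]                   # indices of the k nearest neighbors
    w = 1.0 / (d[idx] + eps)                  # illustrative inverse-distance weights
    w /= w.sum()                              # normalize so the weights sum to 1
    # probability of class c = total weight of the neighbors labeled c
    return {c: float(w[y_train[idx] == c].sum()) for c in np.unique(y_train)}
```

With two well-separated one-dimensional clusters, a query near the first cluster receives almost all of its probability mass from that class, since the closer neighbors carry larger weights.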


Similar Articles

Ideal bootstrap estimation of expected prediction error for k-nearest neighbor classifiers: Applications for classification and error assessment

Euclidean-distance k-nearest neighbor (k-NN) classifiers are simple nonparametric classification rules. Bootstrap methods, widely used for estimating the expected prediction error of classification rules, are motivated by the objective of calculating the ideal bootstrap estimate of expected prediction error. In practice, bootstrap methods use Monte Carlo resampling to estimate the ideal boot...
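The generic Monte Carlo resampling idea this snippet refers to can be sketched as follows, assuming a 1-NN classifier and the leave-one-out (out-of-bag) bootstrap as the error estimator; the cited paper's exact estimator may differ.

```python
import numpy as np

def bootstrap_error_1nn(X, y, B=200, seed=0):
    """Monte Carlo approximation of a bootstrap estimate of the
    expected prediction error of a 1-NN classifier.

    Each of the B resamples is drawn with replacement; the 1-NN rule
    built on the resample is scored on the out-of-bag points.
    A sketch of the generic idea, not the paper's exact estimator.
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    errors = []
    for _ in range(B):
        boot = rng.integers(0, n, size=n)          # bootstrap sample (with replacement)
        oob = np.setdiff1d(np.arange(n), boot)     # points left out of the resample
        for i in oob:
            d = np.linalg.norm(X[boot] - X[i], axis=1)
            pred = y[boot][np.argmin(d)]           # 1-NN prediction for the left-out point
            errors.append(pred != y[i])
    return float(np.mean(errors))
```

On well-separated data the estimated error is close to zero, since an out-of-bag point is misclassified only when the resample happens to contain no point from its own cluster.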


Evolving edited k-Nearest Neighbor Classifiers

The k-nearest neighbor method is a classifier based on evaluating the distances to each pattern in the training set. The edited version of this method applies the classifier using a subset of the complete training set, from which some training patterns are excluded in order to reduce the classification error rate. In recent works, genetic algorithms have been ...
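To illustrate what editing a training set means, the sketch below uses Wilson editing (drop every point that the majority vote of its own neighbors would misclassify) as a simple stand-in for the genetic-algorithm subset search this snippet describes; it is not the cited paper's algorithm.

```python
import numpy as np

def wilson_edit(X, y, k=3):
    """Return the indices of a Wilson-edited training set for k-NN:
    drop every point that the majority vote of its own k nearest
    neighbors (excluding itself) would misclassify.

    A simple stand-in for the GA subset search in the cited paper.
    """
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        nn = np.argsort(d)[:k]                     # its k nearest neighbors
        vals, counts = np.unique(y[nn], return_counts=True)
        if vals[np.argmax(counts)] == y[i]:        # neighbors' majority agrees
            keep.append(i)
    return np.array(keep)
```

A mislabeled point sitting inside the opposite class's cluster is surrounded by neighbors of that class, so the majority vote disagrees with its label and the point is edited out, while cleanly labeled points are retained.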


Error minimizing algorithms for nearest neighbor classifiers

Stack filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems and investigated their relationship to other types of discrete classifiers, such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which...


Diagnosis of Temporomandibular Disorders Using Local Binary Patterns

Background: Temporomandibular joint disorder (TMD) may be manifested as structural changes in bone through modification, adaptation, or direct destruction. We propose to use Local Binary Pattern (LBP) characteristics and histograms of oriented gradients on the recorded images as a diagnostic tool in TMD assessment. Material and Methods: CBCT images of 66 patients (132 joints) with TMD and 66 normal...


Weighted Nearest Neighbor Learning and First-order Error

Weighted nearest-neighbor learning is analyzed in terms of squared error, with a focus on classification problems where the squared error of probability estimates is considered. Two classes of algorithms for calculating weights are studied with respect to their ability to minimize the first-order term of the squared error: local linear regression and a new class termed regularized linear interp...




Publication date: 2009